Gaussian bandwidth selection for manifold learning and classification
Authors
Abstract
Similar resources
Kernel Scaling for Manifold Learning and Classification
Kernel methods play a critical role in many dimensionality reduction algorithms. They are useful in manifold learning, classification, clustering and other machine learning tasks. Setting the kernel’s scale parameter, also referred to as the kernel’s bandwidth, strongly affects the extracted low-dimensional representation. We propose to set a scale parameter that is tailored to the desired application...
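As a rough illustration of how the scale parameter shapes a Gaussian affinity matrix (this is not the tailored selection proposed in the paper), the sketch below builds W_ij = exp(-||x_i - x_j||^2 / (2 sigma^2)) and falls back on the common median-distance heuristic when no scale is given; the function gaussian_affinity and the heuristic are illustrative choices, not part of the paper.

# Illustrative sketch: Gaussian affinity matrix with an explicit or
# median-heuristic bandwidth (not the paper's selection method).
import numpy as np
from scipy.spatial.distance import pdist, squareform

def gaussian_affinity(X, sigma=None):
    """Pairwise Gaussian kernel W_ij = exp(-||x_i - x_j||^2 / (2 sigma^2))."""
    dists = squareform(pdist(X))           # pairwise Euclidean distances
    if sigma is None:                      # median heuristic as a default scale
        sigma = np.median(dists[dists > 0])
    return np.exp(-dists**2 / (2.0 * sigma**2))

X = np.random.default_rng(0).normal(size=(200, 10))
W_small = gaussian_affinity(X, sigma=0.1)    # too small: W is close to the identity
W_large = gaussian_affinity(X, sigma=100.0)  # too large: W is close to all-ones
W_med   = gaussian_affinity(X)               # median heuristic keeps moderate affinities

Sweeping sigma between these extremes is what makes the choice of bandwidth so influential for the downstream embedding.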
Manifold Learning for Medical Image Registration, Segmentation, and Classification
The term manifold learning encompasses a class of machine learning techniques that convert data from a high-dimensional to a lower-dimensional representation while respecting the intrinsic geometry of the data. The intuition underlying the use of manifold learning in the context of image analysis is that, while each image may be viewed as a single point in a very high-dimensional space, a set of such points ...
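A minimal sketch of the "each image is a point" intuition, assuming scikit-learn's Isomap and its bundled 8x8 digit images as stand-ins; the abstract does not commit to a particular embedding algorithm or dataset.

# Sketch: flatten images to vectors and embed them in 2-D with Isomap.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.manifold import Isomap

digits = load_digits()                 # 8x8 grayscale images
X = digits.data                        # each image flattened to a 64-dimensional point
embedding = Isomap(n_neighbors=10, n_components=2).fit_transform(X)
print(embedding.shape)                 # (1797, 2): one 2-D coordinate per image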
Learning Rates for Classification with Gaussian Kernels
This letter aims at a refined error analysis for binary classification using a support vector machine (SVM) with a Gaussian kernel and a convex loss. Our first result shows that for some loss functions, such as the truncated quadratic loss and the quadratic loss, an SVM with a Gaussian kernel can reach an almost optimal learning rate, provided the regression function is smooth. Our second result shows that for ...
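The model class the letter analyzes can be sketched with scikit-learn as below; note that SVC optimizes the hinge loss rather than the (truncated) quadratic losses studied above, and gamma plays the role of the inverse squared bandwidth of the Gaussian kernel, so this only illustrates the setting, not the analysis.

# Sketch: binary classification with an RBF (Gaussian) kernel SVM, tuning
# the kernel width (gamma) and regularization (C) by cross-validation.
from sklearn.datasets import make_moons
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.svm import SVC

X, y = make_moons(n_samples=500, noise=0.2, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

grid = GridSearchCV(SVC(kernel="rbf"),
                    {"gamma": [0.01, 0.1, 1.0, 10.0], "C": [0.1, 1.0, 10.0]},
                    cv=5)
grid.fit(X_tr, y_tr)
print(grid.best_params_, grid.score(X_te, y_te))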
Representation Optimization with Feature Selection and Manifold Learning in a Holistic Classification Framework
Many complex and high-dimensional real-world classification problems require a carefully chosen set of features, algorithms and hyperparameters to achieve the desired generalization performance. The choice of a suitable feature representation has a great effect on the prediction performance. Manifold learning techniques – like PCA, Isomap, Locally Linear Embedding (LLE) or Autoencoders – are able...
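A hedged sketch of such a combined pipeline (not the paper's framework): feature selection, a dimensionality-reduction step (PCA standing in for the manifold-learning techniques named above), and a classifier, with their hyperparameters tuned jointly by cross-validation.

# Sketch: feature selection + dimensionality reduction + classifier,
# tuned jointly with a grid search over the pipeline's hyperparameters.
from sklearn.datasets import load_breast_cancer
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
pipe = Pipeline([
    ("scale", StandardScaler()),
    ("select", SelectKBest(f_classif)),
    ("reduce", PCA()),
    ("clf", LogisticRegression(max_iter=1000)),
])
search = GridSearchCV(pipe,
                      {"select__k": [10, 20, 30],
                       "reduce__n_components": [2, 5, 8]},
                      cv=5)
search.fit(X, y)
print(search.best_params_, search.best_score_)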
Bandwidth Selection in Kernel Density Estimators for Multiple-Resolution Classification
We consider a problem of selecting parameters in a classifier based on the average of kernel density estimators, where each estimator corresponds to a different data “resolution”. The selection is based on adjusting the parameters of the estimators to minimize a substitute for the misclassification ratio. We experimentally compare the misclassification ratio and parameters selected for benchmark d...
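A simple stand-in for this setup (not the paper's selection procedure): a plug-in classifier whose per-class density is the average of Gaussian kernel density estimators at several bandwidths ("resolutions"), with the bandwidth pair chosen by validation accuracy as a crude substitute for the misclassification ratio.

# Sketch: per-class density = average of KDEs at two bandwidths;
# the bandwidth pair is picked on a held-out validation set.
import numpy as np
from itertools import product
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KernelDensity

class AveragedKDEClassifier:
    def __init__(self, bandwidths):
        self.bandwidths = bandwidths

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.priors_ = [np.mean(y == c) for c in self.classes_]
        self.kdes_ = [[KernelDensity(bandwidth=h).fit(X[y == c])
                       for h in self.bandwidths] for c in self.classes_]
        return self

    def predict(self, X):
        # average the density estimates across bandwidths, then pick the
        # class maximizing prior * averaged density
        scores = np.stack([
            prior * np.mean([np.exp(kde.score_samples(X)) for kde in kdes], axis=0)
            for prior, kdes in zip(self.priors_, self.kdes_)
        ], axis=1)
        return self.classes_[np.argmax(scores, axis=1)]

X, y = make_classification(n_samples=600, n_features=5, random_state=0)
X_tr, X_va, y_tr, y_va = train_test_split(X, y, random_state=0)
best = max(product([0.2, 0.5, 1.0], repeat=2),
           key=lambda hs: np.mean(
               AveragedKDEClassifier(hs).fit(X_tr, y_tr).predict(X_va) == y_va))
print("selected bandwidths:", best)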
Journal
Journal title: Data Mining and Knowledge Discovery
Year: 2020
ISSN: 1384-5810, 1573-756X
DOI: 10.1007/s10618-020-00692-x